Image super-resolution reconstruction based on parallel convolution and residual network
Huifeng WANG, Yan XU, Yiming WEI, Huizhen WANG
Journal of Computer Applications 2022, 42(5): 1570-1576. DOI: 10.11772/j.issn.1001-9081.2021050742

Existing image super-resolution reconstruction algorithms can improve either the overall visual quality of an image or the objective evaluation scores of the reconstructed image, but they struggle to improve perceptual quality and objective metrics in a balanced way, and their reconstructed images lack high-frequency information, resulting in blurred textures. To address these problems, an image super-resolution reconstruction algorithm based on parallel convolution and a residual network was proposed. Firstly, with a parallel structure as the overall framework, different convolution combinations were used in the parallel branches to enrich feature information, and skip connections were added to further enrich the features and fuse the branch outputs, so that more high-frequency information was extracted. Then, an adaptive residual network was introduced to supplement information and optimize network performance. Finally, perceptual loss was used to improve the overall quality of the restored image. Experimental results show that, compared with algorithms such as Super-Resolution Convolutional Neural Network (SRCNN), Very Deep convolutional network for Super-Resolution (VDSR) and Super-Resolution Generative Adversarial Network (SRGAN), the proposed algorithm reconstructs images better and produces clearer detail textures in the enlarged results. In the objective evaluation, at ×4 reconstruction scale the proposed algorithm improves Peak Signal-to-Noise Ratio (PSNR) and Structural SIMilarity (SSIM) by 0.25 dB and 0.019 on average, respectively, compared with SRGAN.
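The PSNR figure reported in the evaluation above follows the standard definition, 10·log10(MAX²/MSE). A minimal pure-Python sketch (the tiny example pixel arrays are illustrative assumptions, not data from the paper):

```python
import math

def psnr(ref, rec, max_val=255.0):
    """Peak Signal-to-Noise Ratio between two equal-size images,
    given here as flat lists of pixel intensities."""
    # Mean squared error between reference and reconstruction
    mse = sum((a - b) ** 2 for a, b in zip(ref, rec)) / len(ref)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

# Hypothetical 2x2 patches: reference vs. reconstructed
ref = [52, 55, 61, 59]
rec = [50, 55, 60, 59]
print(round(psnr(ref, rec), 2))  # → 47.16
```

A 0.25 dB gain, as reported against SRGAN, corresponds to a roughly 5.6% reduction in MSE at the same peak value.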
